What is the meaning of the term "film noir"?
Definitions:
- (noun) a movie marked by a mood of pessimism, fatalism, menace, and cynical characters
- the term was applied by French critics to describe American thriller or detective films of the 1940s